X’s child porn detection system doesn’t violate an Illinois biometric privacy law, a federal judge has ruled.
A federal judge dismissed a lawsuit concerning the software X (formerly Twitter) uses to find illegal porn images. The suit was brought by Mark Martell, who objected to X using Microsoft’s PhotoDNA software.
Martell argued that PhotoDNA—which is used across the tech industry to detect and report child porn—required the collection of biometric data and that this collection violated Illinois’ Biometric Information Privacy Act (BIPA).
A win for Martell could have imperiled the use of PhotoDNA and similar software by all sorts of tech companies, thwarting tools that have proved useful in fighting sexually explicit images of minors, non-consensual adult pornography (a.k.a. “revenge porn”), terroristic images, and extremely violent content. Tech companies voluntarily employ these tools in a way that seems minimally invasive to the privacy of your average user—no biometric data collection actually required.
So, while “dude loses biometric privacy suit against big tech” may seem on its surface like a sad story, it’s probably good news that U.S. District Judge Sunil R. Harjani granted X’s motion to dismiss the case.
Here we see a judge protecting big tech capitalism and the registry cottage industry, thus ensuring that all the court and jailhouse doors remain squeak-free of rust.
This is the correct call by the court. It would be a technological absurdity to make it illegal to hash images and compare them against a list of known hashes. It’s a core piece of computing, and involves no biometrics, as the article points out.
Any computer file can be thought of as a binary string: a very long sequence of 1s and 0s. A cryptographic hash function (CHF) takes such a string and maps it to a fixed-length output (256 bits is very common) with properties that make it useful as an identifier: it is functionally impossible to reverse, any particular output value is vanishingly improbable, and it is astronomically unlikely that two different input files will produce the same output (called a collision). It’s not really a “secret signature” (to use the article’s phrase), because it’s not some secret extra data, but a basic mathematical consequence of how digital computing works. Hashing is widely used for file integrity monitoring, malware identification, and the like.
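Those properties are easy to see for yourself. Here is a minimal Python sketch using the standard hashlib library (the byte strings are just placeholders standing in for file contents):

```python
import hashlib

# Two inputs that differ by a single character.
data1 = b"example image bytes"
data2 = b"example image bytez"

# SHA-256 always yields a 256-bit (64 hex character) digest,
# regardless of input size, and a one-character change in the
# input produces a completely different, unpredictable output.
print(hashlib.sha256(data1).hexdigest())
print(hashlib.sha256(data2).hexdigest())
```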
Indeed, you can get an image’s hash right now if you’re familiar with the command line. Open a terminal on macOS or Linux (or PowerShell on Windows), navigate to a directory containing images, and run “shasum -a 256 <filename>” (macOS), “sha256sum <filename>” (Linux), or “Get-FileHash <filename>” (PowerShell, which uses SHA-256 by default). This prints the SHA-256 hash for that image; any bit-identical copy of the file yields the same value.
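The hash-and-compare workflow described above can be sketched the same way. One caveat: PhotoDNA itself uses a proprietary perceptual hash designed to survive resizing and re-encoding, not a plain cryptographic hash, so treat this as a simplified analogue; the filename and the known-hash set here are hypothetical placeholders:

```python
import hashlib

# Hypothetical database of hashes of known illegal images. In real
# deployments, providers load curated hash lists so they never need
# to store or view the original images.
KNOWN_HASHES = {
    "0" * 64,  # placeholder entry, not a real hash
}

def sha256_of_file(path):
    """Stream a file through SHA-256 without loading it all into memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known(path):
    # Membership test: the file is flagged only if its hash
    # exactly matches an entry on the list.
    return sha256_of_file(path) in KNOWN_HASHES

# Example (hypothetical filename): is_known("photo.jpg")
```

Note that nothing here reads faces, fingerprints, or any other biometric identifier; the comparison operates on raw bytes, which is the crux of the court’s ruling.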
Notice how every personal-data grab by big tech or the government is packaged and presented as combating CP. A neat little trick to get people to overlook the potential (even likely) abuses of such procedures/software. And apparently a pretty effective one.
Come to think of it, shouldn’t those companies and the individuals who work in those specific departments (FBI included) be indicted for CP possession? How exactly are these companies/agents not “re-victimizing” those depicted, if mere viewing accomplishes that, as the public has been led to believe?
Crimes like CP distribution/viewing happen in private. You want privacy? This is the price. Combat CP by other means, or sacrifice all privacy and end it through universal, inescapable surveillance.